Microsoft unveils a new small language model | MarTech
Microsoft introduces cost-effective small language models, starting with Phi-3-Mini, a 3.8-billion-parameter model trained on 3.3 trillion tokens.
Open Data Science - Your News Source for AI, Machine Learning & more
Stability AI Releases 1.6 Billion Parameter Language Model Stable LM 2
Stability AI has released pre-trained model weights for the Stable LM 2 language model, a 1.6B parameter model trained on 2 trillion tokens of text data from seven languages.
The model is available in two versions: the base model and an instruction-tuned version called Stable LM 2 Zephyr.
6 Small Language Models to Get the Job Done With Ease
Small language models are scaled-down AI models that require less computational power and training data, making AI tools more accessible to smaller enterprises and individual developers.